Reduce deep learning training time and cost with MosaicML Composer on AWS


In the past decade, we have seen deep learning (DL) adopted at a tremendous pace by AWS customers. The plentiful, jointly trained parameters of DL models give them a large representational capacity that has brought improvements in numerous customer use cases, including image and speech analysis, natural language processing (NLP), time series processing, and more. In this post, we highlight challenges commonly reported in DL training, and how the open-source library MosaicML Composer helps solve them. DL models are trained iteratively, in a nested for loop: an inner loop iterates through the training dataset chunk by chunk (mini-batch by mini-batch), and an outer loop repeats this pass over the whole dataset as many times as needed (epochs).
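The nested loop structure described above can be sketched in plain Python. This is an illustrative skeleton, not MosaicML Composer's API; the function names (`iterate_minibatches`, `train`, `step_fn`) are hypothetical:

```python
def iterate_minibatches(dataset, batch_size):
    """Inner loop: yield the dataset chunk by chunk (mini-batches)."""
    for start in range(0, len(dataset), batch_size):
        yield dataset[start:start + batch_size]

def train(dataset, num_epochs, batch_size, step_fn):
    """Outer loop: repeat full passes over the dataset (epochs).

    step_fn stands in for one training step
    (forward pass, backward pass, parameter update).
    """
    steps = 0
    for epoch in range(num_epochs):                      # epochs
        for batch in iterate_minibatches(dataset, batch_size):
            step_fn(batch)                               # one optimization step
            steps += 1
    return steps

# 100 samples with batch size 32 -> 4 mini-batches per epoch;
# 3 epochs -> 12 training steps in total.
total_steps = train(list(range(100)), num_epochs=3, batch_size=32,
                    step_fn=lambda batch: None)
```

Every training-time optimization ultimately targets this structure: fewer steps, cheaper steps, or both.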
